Exploration server on choral singing and health

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.

Internal identifier: 000232 (Main/Exploration); previous: 000231; next: 000233

Authors: Oliver Grewe [Germany]; Frederik Nagel; Reinhard Kopiez; Eckart Altenmüller

Source:

RBID: pubmed:18039047

French descriptors

English descriptors

Abstract

Most people are able to identify basic emotions expressed in music and experience affective reactions to music. But does music generally induce emotion? Does it elicit subjective feelings, physiological arousal, and motor reactions reliably in different individuals? In this interdisciplinary study, measurement of skin conductance, facial muscle activity, and self-monitoring were synchronized with musical stimuli. A group of 38 participants listened to classical, rock, and pop music and reported their feelings in a two-dimensional emotion space during listening. The first entrance of a solo voice or choir and the beginning of new sections were found to elicit interindividual changes in subjective feelings and physiological arousal. Quincy Jones' "Bossa Nova" motivated movement and laughing in more than half of the participants. Bodily reactions such as "goose bumps" and "shivers" could be stimulated by the "Tuba Mirum" from Mozart's Requiem in 7 of 38 participants. In addition, the authors repeated the experiment seven times with one participant to examine intraindividual stability of effects. This exploratory combination of approaches throws a new light on the astonishing complexity of affective music listening.

DOI: 10.1037/1528-3542.7.4.774
PubMed: 18039047


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.</title>
<author>
<name sortKey="Grewe, Oliver" sort="Grewe, Oliver" uniqKey="Grewe O" first="Oliver" last="Grewe">Oliver Grewe</name>
<affiliation wicri:level="1">
<nlm:affiliation>Institute of Music Physiology and Musicians' Medicine, Hannover University of Music and Drama, Germany.</nlm:affiliation>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Institute of Music Physiology and Musicians' Medicine, Hannover University of Music and Drama</wicri:regionArea>
<wicri:noRegion>Hannover University of Music and Drama</wicri:noRegion>
<wicri:noRegion>Hannover University of Music and Drama</wicri:noRegion>
<wicri:noRegion>Hannover University of Music and Drama</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Nagel, Frederik" sort="Nagel, Frederik" uniqKey="Nagel F" first="Frederik" last="Nagel">Frederik Nagel</name>
</author>
<author>
<name sortKey="Kopiez, Reinhard" sort="Kopiez, Reinhard" uniqKey="Kopiez R" first="Reinhard" last="Kopiez">Reinhard Kopiez</name>
</author>
<author>
<name sortKey="Altenmuller, Eckart" sort="Altenmuller, Eckart" uniqKey="Altenmuller E" first="Eckart" last="Altenmüller">Eckart Altenmüller</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2007">2007</date>
<idno type="RBID">pubmed:18039047</idno>
<idno type="pmid">18039047</idno>
<idno type="doi">10.1037/1528-3542.7.4.774</idno>
<idno type="wicri:Area/Main/Corpus">000224</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Corpus" wicri:corpus="PubMed">000224</idno>
<idno type="wicri:Area/Main/Curation">000221</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Curation">000221</idno>
<idno type="wicri:Area/Main/Exploration">000221</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.</title>
<author>
<name sortKey="Grewe, Oliver" sort="Grewe, Oliver" uniqKey="Grewe O" first="Oliver" last="Grewe">Oliver Grewe</name>
<affiliation wicri:level="1">
<nlm:affiliation>Institute of Music Physiology and Musicians' Medicine, Hannover University of Music and Drama, Germany.</nlm:affiliation>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Institute of Music Physiology and Musicians' Medicine, Hannover University of Music and Drama</wicri:regionArea>
<wicri:noRegion>Hannover University of Music and Drama</wicri:noRegion>
<wicri:noRegion>Hannover University of Music and Drama</wicri:noRegion>
<wicri:noRegion>Hannover University of Music and Drama</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Nagel, Frederik" sort="Nagel, Frederik" uniqKey="Nagel F" first="Frederik" last="Nagel">Frederik Nagel</name>
</author>
<author>
<name sortKey="Kopiez, Reinhard" sort="Kopiez, Reinhard" uniqKey="Kopiez R" first="Reinhard" last="Kopiez">Reinhard Kopiez</name>
</author>
<author>
<name sortKey="Altenmuller, Eckart" sort="Altenmuller, Eckart" uniqKey="Altenmuller E" first="Eckart" last="Altenmüller">Eckart Altenmüller</name>
</author>
</analytic>
<series>
<title level="j">Emotion (Washington, D.C.)</title>
<idno type="ISSN">1528-3542</idno>
<imprint>
<date when="2007" type="published">2007</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adolescent (MeSH)</term>
<term>Adult (MeSH)</term>
<term>Affect (MeSH)</term>
<term>Aged (MeSH)</term>
<term>Aging (physiology)</term>
<term>Blushing (MeSH)</term>
<term>Child (MeSH)</term>
<term>Facial Expression (MeSH)</term>
<term>Female (MeSH)</term>
<term>Humans (MeSH)</term>
<term>Male (MeSH)</term>
<term>Middle Aged (MeSH)</term>
<term>Music (MeSH)</term>
<term>Sensation (MeSH)</term>
<term>Sweating (MeSH)</term>
<term>Tears (MeSH)</term>
<term>Time Factors (MeSH)</term>
</keywords>
<keywords scheme="KwdFr" xml:lang="fr">
<term>Adolescent (MeSH)</term>
<term>Adulte (MeSH)</term>
<term>Adulte d'âge moyen (MeSH)</term>
<term>Affect (MeSH)</term>
<term>Enfant (MeSH)</term>
<term>Expression faciale (MeSH)</term>
<term>Facteurs temps (MeSH)</term>
<term>Femelle (MeSH)</term>
<term>Humains (MeSH)</term>
<term>Larmes (MeSH)</term>
<term>Musique (MeSH)</term>
<term>Mâle (MeSH)</term>
<term>Sensation (MeSH)</term>
<term>Sudation (MeSH)</term>
<term>Sujet âgé (MeSH)</term>
<term>Vieillissement (physiologie)</term>
<term>Érubescence (MeSH)</term>
</keywords>
<keywords scheme="MESH" qualifier="physiologie" xml:lang="fr">
<term>Vieillissement</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Aging</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adolescent</term>
<term>Adult</term>
<term>Affect</term>
<term>Aged</term>
<term>Blushing</term>
<term>Child</term>
<term>Facial Expression</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Middle Aged</term>
<term>Music</term>
<term>Sensation</term>
<term>Sweating</term>
<term>Tears</term>
<term>Time Factors</term>
</keywords>
<keywords scheme="MESH" xml:lang="fr">
<term>Adolescent</term>
<term>Adulte</term>
<term>Adulte d'âge moyen</term>
<term>Affect</term>
<term>Enfant</term>
<term>Expression faciale</term>
<term>Facteurs temps</term>
<term>Femelle</term>
<term>Humains</term>
<term>Larmes</term>
<term>Musique</term>
<term>Mâle</term>
<term>Sensation</term>
<term>Sudation</term>
<term>Sujet âgé</term>
<term>Érubescence</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Most people are able to identify basic emotions expressed in music and experience affective reactions to music. But does music generally induce emotion? Does it elicit subjective feelings, physiological arousal, and motor reactions reliably in different individuals? In this interdisciplinary study, measurement of skin conductance, facial muscle activity, and self-monitoring were synchronized with musical stimuli. A group of 38 participants listened to classical, rock, and pop music and reported their feelings in a two-dimensional emotion space during listening. The first entrance of a solo voice or choir and the beginning of new sections were found to elicit interindividual changes in subjective feelings and physiological arousal. Quincy Jones' "Bossa Nova" motivated movement and laughing in more than half of the participants. Bodily reactions such as "goose bumps" and "shivers" could be stimulated by the "Tuba Mirum" from Mozart's Requiem in 7 of 38 participants. In addition, the authors repeated the experiment seven times with one participant to examine intraindividual stability of effects. This exploratory combination of approaches throws a new light on the astonishing complexity of affective music listening.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Status="MEDLINE" Owner="NLM">
<PMID Version="1">18039047</PMID>
<DateCompleted>
<Year>2008</Year>
<Month>02</Month>
<Day>19</Day>
</DateCompleted>
<DateRevised>
<Year>2007</Year>
<Month>11</Month>
<Day>27</Day>
</DateRevised>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Print">1528-3542</ISSN>
<JournalIssue CitedMedium="Print">
<Volume>7</Volume>
<Issue>4</Issue>
<PubDate>
<Year>2007</Year>
<Month>Nov</Month>
</PubDate>
</JournalIssue>
<Title>Emotion (Washington, D.C.)</Title>
</Journal>
<ArticleTitle>Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.</ArticleTitle>
<Pagination>
<MedlinePgn>774-88</MedlinePgn>
</Pagination>
<Abstract>
<AbstractText>Most people are able to identify basic emotions expressed in music and experience affective reactions to music. But does music generally induce emotion? Does it elicit subjective feelings, physiological arousal, and motor reactions reliably in different individuals? In this interdisciplinary study, measurement of skin conductance, facial muscle activity, and self-monitoring were synchronized with musical stimuli. A group of 38 participants listened to classical, rock, and pop music and reported their feelings in a two-dimensional emotion space during listening. The first entrance of a solo voice or choir and the beginning of new sections were found to elicit interindividual changes in subjective feelings and physiological arousal. Quincy Jones' "Bossa Nova" motivated movement and laughing in more than half of the participants. Bodily reactions such as "goose bumps" and "shivers" could be stimulated by the "Tuba Mirum" from Mozart's Requiem in 7 of 38 participants. In addition, the authors repeated the experiment seven times with one participant to examine intraindividual stability of effects. This exploratory combination of approaches throws a new light on the astonishing complexity of affective music listening.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Grewe</LastName>
<ForeName>Oliver</ForeName>
<Initials>O</Initials>
<AffiliationInfo>
<Affiliation>Institute of Music Physiology and Musicians' Medicine, Hannover University of Music and Drama, Germany.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Nagel</LastName>
<ForeName>Frederik</ForeName>
<Initials>F</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Kopiez</LastName>
<ForeName>Reinhard</ForeName>
<Initials>R</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Altenmüller</LastName>
<ForeName>Eckart</ForeName>
<Initials>E</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>Emotion</MedlineTA>
<NlmUniqueID>101125678</NlmUniqueID>
<ISSNLinking>1528-3542</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName UI="D000293" MajorTopicYN="N">Adolescent</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D000328" MajorTopicYN="N">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D000339" MajorTopicYN="Y">Affect</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D000368" MajorTopicYN="N">Aged</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D000375" MajorTopicYN="N">Aging</DescriptorName>
<QualifierName UI="Q000502" MajorTopicYN="Y">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D001821" MajorTopicYN="N">Blushing</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D002648" MajorTopicYN="N">Child</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D005149" MajorTopicYN="Y">Facial Expression</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D005260" MajorTopicYN="N">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D006801" MajorTopicYN="N">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D008297" MajorTopicYN="N">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D008875" MajorTopicYN="N">Middle Aged</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D009146" MajorTopicYN="Y">Music</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D012677" MajorTopicYN="N">Sensation</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D013546" MajorTopicYN="N">Sweating</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D013666" MajorTopicYN="N">Tears</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName UI="D013997" MajorTopicYN="N">Time Factors</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="pubmed">
<Year>2007</Year>
<Month>11</Month>
<Day>28</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2008</Year>
<Month>2</Month>
<Day>20</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2007</Year>
<Month>11</Month>
<Day>28</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pubmed">18039047</ArticleId>
<ArticleId IdType="pii">2007-17748-012</ArticleId>
<ArticleId IdType="doi">10.1037/1528-3542.7.4.774</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
<affiliations>
<list>
<country>
<li>Allemagne</li>
</country>
</list>
<tree>
<noCountry>
<name sortKey="Altenmuller, Eckart" sort="Altenmuller, Eckart" uniqKey="Altenmuller E" first="Eckart" last="Altenmüller">Eckart Altenmüller</name>
<name sortKey="Kopiez, Reinhard" sort="Kopiez, Reinhard" uniqKey="Kopiez R" first="Reinhard" last="Kopiez">Reinhard Kopiez</name>
<name sortKey="Nagel, Frederik" sort="Nagel, Frederik" uniqKey="Nagel F" first="Frederik" last="Nagel">Frederik Nagel</name>
</noCountry>
<country name="Allemagne">
<noRegion>
<name sortKey="Grewe, Oliver" sort="Grewe, Oliver" uniqKey="Grewe O" first="Oliver" last="Grewe">Oliver Grewe</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Sante/explor/SanteChoraleV4/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000232 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000232 | SxmlIndent | more
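
As a minimal sketch outside the Dilib toolchain (the file name record.xml is an assumption made here, not a Wicri convention), the record can also be saved to disk and its main identifiers extracted with standard Unix tools:

# Save the raw record, then pull the DOI and PMID out of the XML
HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000232 > record.xml
grep -o '<idno type="doi">[^<]*</idno>'  record.xml | sed 's/<[^>]*>//g'   # 10.1037/1528-3542.7.4.774
grep -o '<idno type="pmid">[^<]*</idno>' record.xml | sed 's/<[^>]*>//g'   # 18039047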

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Sante
   |area=    SanteChoraleV4
   |flux=    Main
   |étape=   Exploration
   |type=    RBID
   |clé=     pubmed:18039047
   |texte=   Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i   -Sk "pubmed:18039047" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd   \
       | NlmPubMed2Wicri -a SanteChoraleV4 
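
The same pipeline can also be run non-interactively. As a hedged sketch (the output file name 000232.wiki is illustrative, not a Wicri convention), the generated wiki source can be redirected to a local file for review before import:

# Write the generated wiki text to a local file instead of the terminal
HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i   -Sk "pubmed:18039047" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd   \
       | NlmPubMed2Wicri -a SanteChoraleV4 > 000232.wiki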

Wicri

This area was generated with Dilib version V0.6.37.
Data generation: Sat Oct 10 10:36:24 2020. Site generation: Sat Oct 10 10:37:38 2020